Technology Tales

Adventures & experiences in contemporary technology

All that was needed was a trip to a local shop

5th March 2011

In the end, I did take the plunge and acquired a Sigma 50-200 mm f4-5.6 DC OS HSM lens to fit my ever faithful Pentax K10D. After surveying a few online retailers, I plumped for Park Cameras, where the total cost, including delivery, came to around £125. This was around £50 less than what others were quoting for the same lens, with delivery costs yet to be added. Though the price at Park Cameras was good, I still wondered how they could manage that sort of deal when others do not. Interestingly, it appears that the original price of the lens was around £300, but that may have been at launch, and prices do seem to tumble after that point in the life of many electrical or electronic products.

Unlike the last lens that I bought from them around two years ago, delivery of this item was a prompt affair, with dispatch coming the day after my order and delivery on the morning after that. All in all, that's the kind of service that I like to get. On opening the box, I was surprised to find that the lens came with a hood but without a cap. However, that was dislodged slightly from my mind when I remembered that I had neglected to order a UV or skylight filter to screw into the 55 mm front of it. In the event, it was the lack of a lens cap that needed sorting more than the lack of a filter. The result was that I popped into the local branch of Wildings, where I found the requisite lens cap for £3.99 and asked about a filter while I was at it. Much to my satisfaction, there was a UV filter in stock that matched my needs, though it was cheap at £18.99 and was made by a company of which I had not heard before: Massa. This was another example of good service, with the shop attendant juggling two customers, a gentleman looking at buying a DSLR and myself. While I would not have wanted to disturb another sales interaction, I suppose that my wanting to complete a relatively quick purchase was what got me the attention, while the other customer was left to look over a camera, something that I am sure he would have wanted to do anyway. After all, who wouldn't?

With the extras acquired, I attached them to the front of the lens and carried out a short test (with the cap removed, of course). When it was pointed at an easy subject, the autofocus worked quickly and quietly. A misty hillside had the lens hunting so much that turning to manual focussing was needed a few times, understandable given the subject. Like the 18-125 mm Sigma lens that I already had, the manual focussing ring is generously proportioned, with a hyperfocal scale on it, though some might think the action a little loose. In my experience, it seems no worse than the 18-125 mm, so I can live with it. Both lenses have something else in common: the zoom ring has a stiffer action than the focus ring. However, the zoom lock of the 18-125 mm is replaced by an OS (Optical Stabilisation) switch on the 50-200 mm, and the latter has no macro facility either, another feature of the shorter lens, though one that I cannot ever remember using. In summary, first impressions are good, but I plan to continue appraising it. Maybe an outing somewhere tomorrow will offer a good opportunity to use it a little more and get more of a feeling for its performance.

Initial impressions of Windows 10

31st October 2014

Being ever curious on the technology front, the release of the first build of a Technical Preview of Windows 10 was enough to get me having a look at what was on offer. The furore regarding Windows 8.x added to the interest so I went to the download page to get a 64-bit installation ISO image.

That got installed into a fresh VirtualBox virtual machine and the process worked smoothly to give something not so far removed from Windows 8.1. However, it took until release 4.3.18 of VirtualBox for the Guest Additions to catch up with the Windows prototype, so I signed up for the Windows Insider program and got a 64-bit ISO image to install the Enterprise preview of Windows 10 into a VMware virtual machine instead, since that supported full-screen display of the preview while VirtualBox caught up.

Of course, the most obvious development was the return of the Start Menu and it works exactly as expected too. Initially, the apparent lack of an easy way to disable App panels had me going to Classic Shell for an acceptable Start Menu. It was only later that it dawned on me that unpinning these panels would deliver to me the undistracting result that I wanted.

Another feature that attracted my interest is the new virtual desktop functionality. Here, I was expecting something like what I have used on Linux and UNIX, where each workspace is a distinct desktop and only the applications open in a given workspace show on its panel. Windows does not work that way: all applications are visible on the taskbar regardless of which workspace they occupy, which causes clutter. Another deficiency is the presence of a Task View button on the taskbar instead of a desktop indicator. On Windows 7 and 8.x, I have been a user of VirtuaWin, and this still works largely in the way that I expect of it, except for any application windows that have some persistence associated with them; the Task Manager is an example, and I include some security software in the same category too.

Even so, here are some keyboard shortcuts for anyone who wants to take advantage of the Windows 10 virtual desktop feature:

  • Create a new desktop: Windows key + Ctrl + D
  • Switch to previous desktop: Windows key + Ctrl + Left arrow
  • Switch to next desktop: Windows key + Ctrl + Right arrow

Otherwise, stability is excellent for a preview of a version of Windows that is early on its road to final release. An upgrade to a whole new build went smoothly when initiated following a prompt from the operating system itself. All installed applications were retained, and a new taskbar button for notifications made its appearance alongside the existing Action Centre icon. So far, I am unsure what this does and whether the Action Centre button will be replaced in the fullness of time, but I am happy to wait and see where things go.

All is polished up to now, and there is nothing to suggest that Windows 10 will not be to 8.x what 7 was to Vista. The Start Screen has been dispatched after what has proved to be a misadventure on the part of Microsoft. The PC still is with us, and touchscreen devices like tablets are augmenting it instead of replacing it for any tasks involving some sort of creation. If anything, we have seen the PC evolve, with laptops perhaps becoming more like the Surface Pro, at least when it comes to hybrid devices. However, we are not as happy to smudge our PC screens as those on phones and tablets, so a return to a more keyboard- and mouse-centred approach for some devices is a welcome one.

What I have here are just a few observations, and there is more elsewhere, including a useful article by Ed Bott on ZDNet. All in all, we are early in the process for Windows 10 and, though it looks favourable so far, I will continue to keep an eye on how it progresses. It needs to be less experimental than Windows 8.x; it certainly is less schizophrenic, and it should not be a major jump for users of Windows 7.

Fixing Background Image Display in GNOME Shell 3.10

2nd May 2014

On upgrading from Ubuntu GNOME 13.10 to Ubuntu GNOME 14.04, I noticed a few rough edges. One was the display of my chosen background image: it was garbled. Later, I discovered that there now is a maximum width of 2560 px for background images in GNOME Shell and that things get messy beyond that.

In my case, the image width was around 6000 px, and I was used to its getting resized in GNOME Shell 3.8 and its predecessors. That functionality seems to have been removed since, so the workaround of manually resizing the image in GIMP needed to be employed. Though having big images open in memory creates an additional overhead, not handling them at all well looks like a bug, one caused by setting 2560 px as a maximum screen width for the GNOME Shell panel and by the complete removal of Nautilus from desktop rendering duties. Let's hope that sense is seen with ever larger screen sizes and resolutions coming our way.
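GIMP is not the only way to do the resizing; ImageMagick can manage it from the command line too. Here is a minimal sketch, assuming that the tool is installed and that the image is a hypothetical background.jpg:

convert background.jpg -resize 2560x background-2560.jpg

The 2560x geometry sets the width to 2560 px, while the height is chosen automatically to preserve the aspect ratio.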

It's the sort of thing that had me looking at adding on Cinnamon 2.2 for a while, before I discovered how to set background image scaling using the indispensable GNOME Tweak Tool. LinuxG.net has a useful tutorial on this for anyone with such an adventurous streak in them. For now though, I am OK with my set-up, but the GNOME project's focus on minimalism could affect us in other ways, so I can see why Clem Lefebvre started the Cinnamon project primarily for Linux Mint; that desktop environment is appearing elsewhere too. After all, Gedit lost its menu bar in GNOME 3.12, so it's just as well that we have alternative choices.
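For those who prefer a terminal, the same settings that GNOME Tweak Tool exposes can be changed with gsettings. A minimal sketch, with the image path being a placeholder for your own file:

gsettings set org.gnome.desktop.background picture-uri 'file:///home/user/Pictures/background.jpg'

gsettings set org.gnome.desktop.background picture-options 'scaled'

Other values for picture-options include 'zoom', 'centered' and 'stretched', so a little experimentation may be needed to find what suits a given image.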

Update 2014-05-06: It seems that the desktop image bug that afflicts GNOME Shell 3.10 got sorted for GNOME Shell 3.12. At least, that is the impression that an Antergos instance in a VirtualBox virtual machine gives me.

Battery life

2nd October 2011

In recent times, I have been lugging my Toshiba Equium with me while working away from home; I need a full-sized laptop of my own for attending to various things after work hours, so it has to come with me. It's not the most portable of things given its weight and its lack of battery life. Now that I think of it, I reckon that it's more of a desktop PC replacement machine than a mobile workhorse. After all, it only lasts an hour away from a power socket on its own battery, and Virgin Trains' tightness with such sockets on their Pendolinos is another matter…

Leaving my BlackBerry aside, battery life seems to be something with which I haven't had much luck, because my Asus Eee PC isn't too brilliant either. Without decent power management, two hours seems to be as good as I get from its battery; with better power management software on board, three to four hours become possible. That makes the netbook even more usable, though there are others out there offering longer battery life. Still, I am not tempted by these, because the gadget works well enough for me that I don't need to wonder about how much money I am spending on building a mobile computing collection.

While I am not keen on spending too much cash or having a collection of computers, the battery life situation with my Toshiba is giving me more than pause for thought. The figures quoted for MacBooks had me looking at them, though they aren't at all cheap. Curiosity about the world of the Mac may make them attractive to me, but the prices forestalled that and the concept was left on the shelf.

Recently, PC Pro ran a remarkably well-timed review of laptops offering long battery life (in issue 205). The minimum lifetime in this collection was over five hours, so the list of reviewed devices is an interesting one for me. In fact, it may even become a shortlist should I decide to spend money on buying a more portable laptop than the Toshiba that I already have. The seventeen-hour battery life for a Sony VAIO SB series sounds intriguing, even if you need to buy an accessory to gain this. That it does over seven hours without the extra battery slice makes it more than attractive anyway. The review was food for thought and should come in handy if I decide that money needs spending.

Using .htaccess to control hotlinking

10th October 2020

There are times when blogs cease to exist and the only place to find the content is on the Wayback Machine. Even then, it is in danger of being lost completely. One such example is the subject of this post.

Though this website makes use of the facilities of Cloudflare for various functions that include the blocking of image hotlinking, the same outcome can be achieved using .htaccess files on Apache web servers. Nginx does not support .htaccess files, so the equivalent rules would need to go into its own configuration files instead; some frown upon the .htaccess approach even on Apache, preferring the main server configuration when access to it is available. In any case, the lines that need adding to .htaccess are listed below, though the web address needs to include your own domain in place of the dummy example provided:

RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourdomain.com(/)?.*$ [NC]
RewriteRule .*\.(gif|jpe?g|png|bmp)$ - [F,NC]

The first line turns on the mod_rewrite engine, and you may have that done anyway. Of course, the module needs enabling in your Apache configuration for this to work, and you also have to be allowed to perform the required actions from a .htaccess file, which means changing the Apache configuration files. The second line exempts requests with an empty referer string, such as direct visits, while the third applies the rule only when the referer is not your own web domain; in the last line, the hyphen means that nothing is substituted and the F flag forbids access. To allow more domains, copy the third line and change the web address accordingly; any new lines need to precede the final one that defines the file extensions to be blocked from other web addresses, as in the sketch that follows.
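For instance, here is a minimal sketch that allows a second domain as well, with yourotherdomain.com being a placeholder for another site of yours:

RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourdomain.com(/)?.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourotherdomain.com(/)?.*$ [NC]
RewriteRule .*\.(gif|jpe?g|png|bmp)$ - [F,NC]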

Another variant of the previous code involves changing the last line to display a default image, showing others what is happening. That may not reduce the bandwidth usage as much as complete blocking, but it may be useful for telling others what is going on. The extra RewriteCond line stops the replacement image from matching the rule itself, which otherwise would set off a rewriting loop:

RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourdomain.com(/)?.*$ [NC]
RewriteCond %{REQUEST_URI} !^/images/image\.gif$
RewriteRule \.(gif|jpe?g|png|bmp)$ /images/image.gif [L,NC]

An unexpected side effect

29th June 2007

I recently posted about using mod_rewrite to block access to your images from all but the websites to which you want access to be available. Soon after doing so, I discovered that my favicon had disappeared from Firefox's address bar. As it turned out, it was easy to fix, and that is covered in another recent post.

Some online writing tools

15th October 2021

Every week, I get an email newsletter from Woody's Office Watch. This was something to which I started subscribing in the 1990s, but I took a break from it for a good while for reasons that I cannot recall, returning to it only in recent years. This week's issue featured a list of online paraphrasing tools that are part of what is offered by Quillbot, Paraphraser, Dupli Checker and Pre Post Seo. Each got its own review in the newsletter, so I will just outline other features in this posting.

In Quillbot's case, the toolkit includes a grammar checker, a summary generator and a citation generator. Beyond the online offering, there are extensions for Microsoft Word, Google Chrome and Google Docs, and a paid subscription augments the free version.

In spite of its name, Paraphraser is about more than what the title purports. There is article rewriting, plagiarism checking, grammar checking and text summarisation. Because there is no premium version, the offering is funded by advertising and it will not work with an ad blocker enabled. The mention of plagiarism suggests a perhaps murkier side to writing that cuts both ways: one aim is to avoid copying other work, while another is the avoidance of groundless accusations of copying.

It would appear that the main role of Dupli Checker is to avoid accusations of plagiarism by checking what you write, yet there is a grammar checker as well as a paraphrasing tool on there too. When I tried it, the English that it produced looked a little convoluted, and there is a lack of fluency in what is written on its website as well. Alongside a free offering supported by ads that my ad blocker did not block, there are premium subscriptions too.

In web publishing, they say that content is king, so the appearance of an option with the acronym for Search Engine Optimisation in its name may not be as strange as it might seem at first glance. There are numerous tools here, with both free and paid tiers of service. While paraphrasing and plagiarism checking get top billing in the main menu on the home page, further inspection reveals that there is a lot more to check on this site.

In writing, inspiration is a fleeting and ephemeral quantity, so anything that helps with it has to be of interest. While any rewriting of initial content may appear less smooth than the starting point, any help with the creation process cannot go amiss. For that reason alone, I might be tempted to try these tools from time to time, and they might assist with proofreading as well, because that can be a hit-and-miss affair for some.

 

Databases & Programming

29th September 2012

The world of UNIX appears to attract those interested in the more technical aspects of computing and, since Linux is cut from the same lineage, it seems apt to include a list of computing languages here. Both scripting and programming languages appear, despite a title shortened for the sake of brevity. Since much code cutting involves working with databases, those get a mention too.

In time, I plan to correct the imbalance between programming and scripting languages that currently exists. The original list was bare, so descriptions have been added; they will become all the more necessary should what you find here expand.

Programming and Scripting Languages

Apache Groovy

My first encounter with an implementation of this language was the one belonging to a statistical computing environment (SCE), and that remains an ongoing dalliance. It is easy to think of Groovy as a way of working with a Java-based API using a scripting language, and it certainly feels like that; saying that, it all works better if you know Java. You also have to watch for the development of domain-specific language capability: the aforementioned SCE has its own object and method hierarchy, which means that not all standard Groovy functionality is available.

Clojure

Clojure is a dynamic, functional programming language that runs on the Java Virtual Machine (JVM) and is designed for building robust and scalable software applications. It is characterised by its emphasis on immutability, persistent data structures, and seamless interoperability with Java. Clojure embraces the Lisp programming language’s principles, providing a concise syntax and powerful abstractions for managing state, concurrency, and functional programming paradigms. With its focus on simplicity, expressiveness, and the ability to leverage the vast Java ecosystem, Clojure enables developers to create efficient and maintainable code for a wide range of applications.

Erlang

This is a programming language designed for building highly concurrent, fault-tolerant, and scalable systems that was developed by Ericsson in the late 1980s for telecommunication systems, where reliability and performance are critical. Erlang incorporates features such as lightweight processes, message passing, and built-in support for fault tolerance, making it well-suited for developing distributed and real-time applications. Its unique concurrency model and emphasis on fault tolerance have led to its widespread use in industries such as telecommunications, banking, gaming, and web development, where systems need to handle high loads, be resilient to failures, and provide real-time responsiveness.

Elixir

Inspired by Erlang, Elixir is a functional, concurrent programming language designed for building scalable and fault-tolerant applications. It leverages the powerful concurrency model of the Erlang Virtual Machine (BEAM) while providing a more accessible and expressive syntax. It offers features such as lightweight processes, message passing, pattern matching, and a robust ecosystem of libraries and frameworks. With its focus on reliability, performance, and ease of development, Elixir is well-suited for developing highly concurrent and distributed systems, making it a popular choice for building web applications, real-time systems, and software that requires high availability.

Go

Computing languages often get strange names, like single letters or small words such as this one, which means that you need to look for "Golang" in any online search. In any case, Go originated at Google, and numbered among its inventors is one of the creators of the C programming language. The intent here is massively multithreaded system programming using stand-alone executable components while retaining or enhancing code readability. Another facet is the ability to function efficiently in distributed computing environments like those at SoundCloud or Uber. A variety of tools have been written using the language, including the ever pervasive Docker and Kubernetes.

Julia

It remains an odd decision to give a computing language a girl's name, but the purpose is a serious one. Often, there is a trade-off between speed of code writing and speed of execution, with the result that data programming involves prototyping in one language and porting to another for production usage. The first group includes R and Python, while the second includes C, C++, FORTRAN and even Java, so there is an element of translation involved that often means that different people are involved, adding the risk of errors caused by misunderstandings. This gets described as the two-language problem, and Julia's major raison d'être is the avoidance of it: its top-line description is that it is as quick to program as Python but runs as fast as C because of its just-in-time compilation, multiple dispatch and in-built multithreading. This also allows for extensive scientific computing capabilities that go beyond machine learning; an example comes in the number of differential equation solvers that are available. It also helps that metaprogramming makes everything more generalisable.

Perl

It has been around since the 1980s and still pervades, though it is not as dominant as it once was for creating dynamic websites or for system administration; PHP has taken on much of the former, while Python is making inroads into the latter. Still, no list would be complete without a mention of the once ubiquitous scripting language, and it once powered my online photo gallery. It may be an easy enough language to pick up, and there is plenty of documentation on the web, with Perldoc, Perl Maven and Perlmeister being some good places to look; Dan Massey has some interesting articles on his site too. Not only that, but it is extensible as well, with plenty of extra modules to be found on CPAN.

PHP

This usurper has taken the place of Perl for powering many of the world’s websites. That the language is less verbose probably helps its case and many if not most CMS packages make use of its versatility.

Python

It may be Google's preferred scripting language for system administration, but it is its usefulness for data science where it really has shone in the eyes of many. There are numerous packages for data wrangling, data visualisation and machine learning that make the language ever present in any data scientist's toolbox, and looking in the PyPI archive will allow you to find what you need. It has its place in web scripting too, even if it is not as pervasive as PHP; CMSs like Plone run on Python, and there is the Django framework together with the Gunicorn web server.

OpenJDK

One of the acts of Jonathan Schwartz while he was head of Sun Microsystems was to make Java open source after more than a decade of it being largely proprietary, and OpenJDK is the project that resulted. Of course, his more notable act at Sun was to sell the company to Oracle, but that's another story altogether…

R

This is an open-source implementation of the S language that is much appreciated by statisticians and is much used in the teaching of the subject. The base language only has so much functionality, but there are many packages that extend it, with plenty to be found on repositories like CRAN and others on various GitHub repositories, though the latter tend to be more experimental in nature. There are commonly used and well-supported mainstays that everyone uses, but there always is a need to verify that a particular package does what it claims to do. Given that, there are possibilities for data wrangling, data tabulation, data visualisation and data science. While quick to code, R is slow to execute compared with others; I have found that Python is faster, but R still has a use for smaller data sets. Both keep their temporary data sets in system memory, so that helps.

Rust

It came as a surprise that this Mozilla-originated language is gaining traction in scientific data analysis, possibly because it is a fast multithreaded counterpart to C and C++ with some added safety features (though these can be turned off when needed, provided extra care is taken). The downsizing of Mozilla led to a sharp reduction in its team of Rust developers, and the Rust Foundation has been set up to oversee the language instead. There are online books like The Rust Programming Language and the Rust Cookbook, with the first of these also having paper and e-book counterparts from No Starch Press. For those interested in a more interactive introduction, there also is the Tour of Rust.

Databases

MariaDB

This essentially is a fork of MySQL (see below) now that Oracle owns it. The originators of MySQL are the creators of MariaDB, so their claim that it is a drop-in replacement may have some traction. So far, though, I have seen no exodus from MySQL.

MySQL

After passing through the hands of a number of owners until it incongruously came into the custodianship of Oracle (which, of course, already had and still has a database system of its own), the database system that powers many dynamic websites almost remains a de facto standard and looks set to stay that way for now.

MongoDB

This may be a document-based database rather than a relational one as many of us understand them, but it still is being touted as an alternative to the more mainstream competition. Database technology isn't just about SQL, and MongoDB champions a NoSQL approach; it sounds as if the emergence of semi-structured data formats like XML and JSON is what has facilitated the NoSQL database technologies.

PostgreSQL

This project may have more open-source credibility than MySQL, but it seems to remain in its shadow, though that may be explained by its being a more complex piece of software to use (at least, that has been my experience, anyway). It so happens that this is what Debian installs if you specify the web server option at operating system installation time.

Refurbished Computers

12th July 2014


While I have never been a home user of refurbished or second-hand kit, there are those who are, and there do appear to be some bargains to be had. For some reason, I get the sense that computing and photographic hardware is heading more upmarket as time goes on, so it may be that this becomes the only way of getting cheaper computers, unless you stick with Chromebooks and their like. Interestingly, the now defunct Micro Mart magazine did a feature on the subject, and even Apple has legitimised the idea with its presence.

Manufacturers

Apple

With the premium reputation that Apple has, the chance of bagging any sort of a bargain from them is too good to overlook and they have had a refurbished goods store for longer than many. There are no iPhones here but Macs, iPads and iPods are made available in this way so it is worth a look. The chance of a cheaper Mac of some sort is a tempting idea.

Dell

A colleague of mine at work swears by this so much that it is where he looked when buying a laptop for his father. There are home and business sections too, so even servers are available, along with laptops, desktop PCs and tablets.

Resellers

eBuyer

This is a computer kit reseller that I have never used so far, though there have been qualms expressed about their customer service. Like many, they too have a clearance section, so it may be worth a look if you fancy taking a little risk.

Morgan

The mainstay of this lot is pre-used computers, and they have been around a while too, even if they disappeared from the web at one stage. They also had a shop near Manchester's Piccadilly train station, though I am left wondering if any of the apparent bargains tempted anyone.

Specialists

Giga Refurb
MicroDream
Pure IT Refurbished
Tier1online.com
Itzoo

These have the quality of their work approved by Microsoft itself, so there should be some confidence here. With Microsoft having put Windows XP out to grass, Windows 7 is being promoted on machines with at least Intel Core 2 Duo CPUs, and prices can be very reasonable too.

Command line mapping of network drives

5th September 2007

Mapping network drives in Windows usually involves shuffling through Explorer menus. There is another way that I consider to be neater: using the Windows command line ("DOS" to some). The basic command for creating a mapping goes like this:

net use w: \\yourserver.address

To ensure persistence of the mapping across different Windows sessions, use this:

net use w: \\yourserver.address /persistent:yes

Here’s how to set up a mapping that logs in as a different user:

net use w: \\yourserver.address password /user:you

The above can include domain information as well, and in a number of different forms: domain\username is one.
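For example, and with yourdomain standing in as a placeholder for a real domain name, the command becomes:

net use w: \\yourserver.address password /user:yourdomain\you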

To delete a mapping, try this:

net use w: /delete

List all existing mappings:

net use

This is a flavour of what is available, and Microsoft does provide documentation. Issuing the following command will bring up some of that on the command line:

net help use
